Gaussian processes

Author

  • Chuong B. Do
Abstract

In these notes, we will talk about a different flavor of learning algorithms, known as Bayesian methods. Unlike classical learning algorithms, Bayesian algorithms do not attempt to identify “best-fit” models of the data (or similarly, make “best guess” predictions for new test inputs). Instead, they compute a posterior distribution over models (or similarly, compute posterior predictive distributions for new test inputs). These distributions provide a useful way to quantify our uncertainty in model estimates, and to exploit our knowledge of this uncertainty in order to make more robust predictions on new test points. We focus on regression problems, where the goal is to learn a mapping from some input space X = R^n of n-dimensional vectors to an output space Y = R of real-valued targets. In particular, we will talk about a kernel-based fully Bayesian regression algorithm, known as Gaussian process regression. The material covered in these notes draws heavily on many different topics that we discussed previously in class (namely, the probabilistic interpretation of linear regression, Bayesian methods, kernels, and properties of multivariate Gaussians).

The organization of these notes is as follows. In Section 1, we provide a brief review of multivariate Gaussian distributions and their properties. In Section 2, we briefly review Bayesian methods in the context of probabilistic linear regression. The central ideas underlying Gaussian processes are presented in Section 3, and we derive the full Gaussian process regression model in Section 4.
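As a rough illustration of the algorithm the notes build toward, below is a minimal sketch of Gaussian process regression with a squared-exponential (RBF) kernel, computing the posterior predictive mean and covariance at test inputs. The kernel length-scale, noise variance, and toy sine-wave data are assumptions made for this example only; they are not taken from the notes.

    import numpy as np

    def rbf_kernel(A, B, length_scale=1.0):
        # Squared-exponential kernel: k(x, x') = exp(-||x - x'||^2 / (2 * length_scale^2))
        sq_dists = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
        return np.exp(-0.5 * sq_dists / length_scale**2)

    def gp_posterior(X_train, y_train, X_test, noise_var=0.1, length_scale=1.0):
        # Posterior predictive of a zero-mean GP given noisy observations y = f(x) + eps,
        # eps ~ N(0, noise_var), using the standard conditioning formula for multivariate Gaussians.
        K = rbf_kernel(X_train, X_train, length_scale) + noise_var * np.eye(len(X_train))
        K_s = rbf_kernel(X_train, X_test, length_scale)
        K_ss = rbf_kernel(X_test, X_test, length_scale)
        mean = K_s.T @ np.linalg.solve(K, y_train)
        cov = K_ss - K_s.T @ np.linalg.solve(K, K_s)
        return mean, cov

    # Toy one-dimensional example (assumed data, for illustration only).
    X_train = np.linspace(-3, 3, 10).reshape(-1, 1)
    y_train = np.sin(X_train).ravel() + 0.1 * np.random.randn(10)
    X_test = np.linspace(-4, 4, 50).reshape(-1, 1)
    mu, Sigma = gp_posterior(X_train, y_train, X_test)
    std = np.sqrt(np.maximum(np.diag(Sigma), 0.0))  # pointwise predictive standard deviation

The predictive mean tracks the training data, while std grows away from the observed inputs; this is the kind of uncertainty quantification over predictions that the abstract refers to.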

Similar resources

The Rate of Entropy for Gaussian Processes

In this paper, we show that in order to obtain the Tsallis entropy rate for stochastic processes, we can use the limit of conditional entropy, as was done for the Shannon and Renyi entropy rates. Using this, we obtain the Tsallis entropy rate for stationary Gaussian processes. Finally, we derive the relation between the Renyi, Shannon, and Tsallis entropy rates for stationary Gaussian proc...

Complete convergence of moving-average processes under negative dependence sub-Gaussian assumptions

Complete convergence is investigated for moving-average processes of a doubly infinite sequence of negatively dependent sub-Gaussian random variables with zero means, finite variances, and absolutely summable coefficients. As a corollary, the rate of complete convergence is obtained under some suitable conditions on the coefficients.

ADK Entropy and ADK Entropy Rate in Irreducible-Aperiodic Markov Chain and Gaussian Processes

In this paper, the two-parameter ADK entropy, as a generalization of Rényi entropy, is considered and some of its properties are investigated. We will see that the ADK entropy for continuous random variables is invariant under a location transformation but not under a scale transformation of the random variable. Furthermore, the joint ADK entropy, conditional ADK entropy, and chain rule of this ent...

Properties of Spatial Cox Process Models

Probabilistic properties of Cox processes of relevance for statistical modeling and inference are studied. In particular, we study the most important classes of Cox processes, including log Gaussian Cox processes, shot noise Cox processes, and permanent Cox processes. We consider moment properties and point process operations such as thinning, displacements, and superpositioning. We also discuss...

Quantum Gaussian Processes

This paper studies the construction of quantum Gaussian processes based on ordinary Gaussian processes through their reproducing kernel Hilbert spaces, and investigates the relationship between the stochastic properties of the quantum Gaussian processes and the base Gaussian processes. In particular, we construct quantum Brownian bridges and quantum Ornstein-Uhlenbeck processes. Non-commutative stoc...

Publication date: 2007